Rajarshi Muhuri
rmuhuri@gmail.com | +1 (469) 662-1163

Professional Summary
· 4 years of SAP HANA and 5 years of SAP BW experience
· 3 years of Data Services and BOBJ/BI tools experience
· 6 months of HADOOP and Predictive Analysis experience
· 3 years of Nuclear Engineering and Core Design experience
· 8 years of SAP experience
· 12 years of industry experience

Education
· Indian Institute of Technology (Kharagpur) – Bachelor of Science (Honors) in Chem, 1998
· Indian Institute of Technology (Kharagpur) – Master of Science (Honors) in Chem, 2000
· University of Cincinnati (Ohio) – Master of Science in Nuclear Engineering, 2004

Certification
· Westinghouse Core Design and Reactor Physics – (Nuclear Design 01)
· SAP Certified Application Associate – SAP HANA 1.0
· SAP Certified Application Professional – SAP HANA 1.0

Work status: US citizen
Experience Matrix:

Technology | Profile | End Client | Employer | Duration
SAP HANA / HADOOP / BODS / Design Studio / BOBJ | HANA Architect | Merit Energy (Oil and Gas) | Independent | 8 months
SAP HANA / HADOOP / SAP BODS / Lumira / Predictive Analytics | SAP Big Data Architect | Phillips66 (Oil and Gas) | Accenture | 5 months
SAP HANA / XS Engine / Predictive Analysis / Lumira / SAPUI5 | HANA Architect | Shell (Oil and Gas) | Accenture | 3 months
SAP HANA / SAP BODS / SAP BW on HANA / BOBJ | HANA Architect | Verizon (Telecom) | Accenture | 2 years
SAP HANA / SAP BW / SAP BODS | HANA Modeler | Vodafone UK (Telecom) | Accenture | 5 months
SAP BW / SAP BODS | SAP BW Developer | Monsanto (Chemicals) | Accenture | 6 months
Reactor Physics And Core Design | Nuclear Engineer | Westinghouse (Nuclear) | Westinghouse | 3 years
SAP PI | Senior Systems Engineer | Rockwell (Automation) | IBM | 2 years
.NET / C# / MS SQL | Developer | St Paul (Insurance) | Cognizant | 1 year
CASMO / SIMULATE / BWR Core Design / FORTRAN | Nuclear Intern | First Energy (Nuclear) | UC / FENOC | 1 year
Detailed Work Experience
Merit Energy Company (HANA Architect / Developer)
Jul 2014 – Present
Merit Energy Company uses Quorum software to administer its land mapping and oil production, tied to SAP financial data. Quorum also sells analytics based on SAP BOBJ to its many clients. However, for years the performance had been substandard, and in some cases the measures wrongly doubled up, an issue that plagued multiple Quorum clients. Merit wanted to independently rework the solution on the HANA platform and to eliminate the wrongly multiplying measures. Merit had hired a specialized HANA firm in 2012 to do this as a POC. The 4-member team delivered on HANA in 3 months; however, the performance was not up to the mark and the measures still multiplied wrongly. I came on board in July 2014 and within a month delivered a solution that was blazing fast and whose measures added up correctly. The classic problem of multivalued dimensions was solved using HANA window functions.
· Data Services was used to bring in the data from the ORACLE DB.
· Much of the logic was ported into Data Services, combining many ORACLE tables into sets of single HANA tables.
· HANA window functions were used to circumvent the earlier issue of doubled measures.
· The front end was delivered using a BOBJ universe and WebI.
· Existing reports on MS SQL are being converted to HANA models, with Design Studio replacing Xcelsius dashboards.
· Ongoing work is reworking classical SAP BW into an optimized mixed-model BW on HANA.
· BW Planning Functions logic embedded in ABAP is being converted to SQL scripts.
· Installed and maintained SAP Data Services and SAP HANA, including writing installation guides.
· Involved in SAP BPC development.
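The window-function fix for multivalued dimensions can be sketched in miniature. The following is an illustrative, self-contained example (invented well/owner tables, not Merit's actual Quorum schema) run against SQLite's window-function support: a join to a multivalued dimension fans out the fact rows and doubles the measure, and numbering the fanned-out rows per fact key restores the correct total.

```python
import sqlite3

# Illustrative schema: a fact row joined to a multivalued dimension fans out,
# so a plain SUM() double-counts the measure.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE production (well_id INTEGER, volume REAL);
INSERT INTO production VALUES (1, 100.0), (2, 50.0);
CREATE TABLE well_owner (well_id INTEGER, owner TEXT);
-- well 1 has two owners: the join duplicates its fact row
INSERT INTO well_owner VALUES (1, 'A'), (1, 'B'), (2, 'C');
""")

# Naive join: well 1's volume is counted twice -> 250 instead of 150
naive = con.execute("""
    SELECT SUM(p.volume)
    FROM production p JOIN well_owner o ON p.well_id = o.well_id
""").fetchone()[0]

# Window-function fix: number the duplicated rows per fact key and keep the
# measure only on the first, so each fact row contributes exactly once.
fixed = con.execute("""
    SELECT SUM(v) FROM (
      SELECT CASE WHEN ROW_NUMBER() OVER
                       (PARTITION BY p.well_id ORDER BY o.owner) = 1
                  THEN p.volume ELSE 0 END AS v
      FROM production p JOIN well_owner o ON p.well_id = o.well_id
    )
""").fetchone()[0]

print(naive, fixed)  # 250.0 150.0
```

The same pattern expressed in HANA SQLScript pushes the deduplication into the calculation view instead of a post-join aggregate.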
Accenture (Role: SAP Big Data Architect)
Phillips66
Feb 2014 – Sep 2014
SAP HANA, Sybase IQ, and Hortonworks HADOOP are being used in conjunction with predictive analysis. PAL algorithms are used for outlier analysis and preventive maintenance, drawing on the HANA time-series, classification, and clustering algorithms. Data is fed into HADOOP via Data Services in nightly batches. HADOOP uses MapReduce (via HIVE/PIG) to extract and enrich the relevant data into tabular format within HADOOP. HANA consumes HADOOP data either federated or persisted within HANA, depending on the use case: for reports and analysis, data is brought directly into HANA, but it is federated via SDA for predictive analysis. A 3-tier system, with HANA for HOT data, IQ for WARM data, and HADOOP as the data dump, is being architected.
· Oil flow logs and sensor data are brought into HADOOP HDFS via Data Services.
· HIVE is used to parse the log data, extracting key flow-related parameters (flow rate, viscosity, coefficients) into ORC (Optimized Row Columnar) format.
· Sensor data is also formatted into a tabular structure within HDFS, with key data like pressure, temperature, bore diameter, etc.
· Oozie is used to drive the entire workflow within HADOOP.
· Data is pulled into HANA via nightly Data Services jobs and modeled as views combining fact tables from HADOOP with dimension tables from HANA.
· SDA is used to pull data from HADOOP (federated approach) to model predictive algorithms and visualizations in Lumira.
· Modifying HANA Live content.
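Conceptually, the HIVE extraction step turns semi-structured log lines into tabular rows ready for columnar (ORC-style) storage. A minimal Python stand-in, with a hypothetical log format rather than the actual Phillips66 feed, might look like:

```python
# Hypothetical flow-log format: each line carries a timestamp and key=value
# pairs for flow-related parameters, which we flatten into tabular records.
RAW_LOG = """\
2014-03-01T00:00Z well=W1 flow_rate=120.5 viscosity=0.89
2014-03-01T00:10Z well=W1 flow_rate=118.2 viscosity=0.91
2014-03-01T00:00Z well=W2 flow_rate=80.0 viscosity=1.02
"""

def parse_line(line):
    """Split a 'timestamp key=value ...' line into a flat record."""
    ts, rest = line.split(" ", 1)
    fields = dict(kv.split("=", 1) for kv in rest.split())
    return {"ts": ts, "well": fields["well"],
            "flow_rate": float(fields["flow_rate"]),
            "viscosity": float(fields["viscosity"])}

rows = [parse_line(line) for line in RAW_LOG.splitlines()]
print(len(rows), rows[0]["flow_rate"])  # 3 120.5
```

In the actual pipeline this parsing is expressed as HiveQL over HDFS files, with the results written to ORC tables that HANA then reads via SDA or Data Services.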
Accenture (Role: HANA Architect)
Shell
Nov 2013 – Jan 2014
WRFM plant maintenance data was captured via third-party tools and via SAP PM notifications. The data was SLT'ed into HANA and models were built for reporting. The predictive capabilities of HANA were used for exploratory analysis of the data, with Lumira as the front-end display tool. Drill-down to specific component details was achieved by creating OData services on HANA tables, with SAPUI5 as the front end.
· Developed scripted calculation views with embedded 'Read-Only' stored procedures.
· Created OData REST services on HANA tables and consumed the OData from UI5.
· Correlated hypotheses for plant issues with PAL algorithms.
· Mapped the distribution-by-defect hypothesis to a classification algorithm, and defect frequency to an extrapolation algorithm.
· Ran in-database predictive algorithms in HANA and used Lumira to display results and correlations.
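The defect-frequency extrapolation idea can be illustrated with a toy calculation (made-up monthly counts, not Shell's data): fit a least-squares trend line to the observed counts and project the next period, which is the shape of what an in-database PAL trend/extrapolation algorithm computes.

```python
# Made-up monthly defect counts for five months.
months = [1, 2, 3, 4, 5]
defects = [4, 6, 5, 8, 9]

# Ordinary least-squares line fit: slope and intercept from the means.
n = len(months)
mean_x = sum(months) / n
mean_y = sum(defects) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, defects))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

# Extrapolate the trend to month 6.
forecast = slope * 6 + intercept
print(forecast)  # 10.0
```

In practice the fit runs inside HANA via PAL, with Lumira visualizing the observed series against the projected trend.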
Accenture (Role: HANA Architect)
Verizon
Jan 2012 – Oct 2013
HANA Phase II included the creation of Financial Close models from SAP and non-SAP data to build BOBJ reports using WebI and Analysis for Office. There was real-time data provisioning via SLT from ECC to HANA and from ORACLE to HANA. Another component sent GL summary data from HANA models to an external CPM system via Data Services, to achieve 360-degree reporting.
HANA Phase III included a mixed-mode SAP HANA and SAP BW on HANA approach, with standalone HANA models, HANA models exposed as VirtualProviders in SAP BW, and finally classical BW on HANA models. The latter made use of in-memory optimized DSOs exposed through MultiProviders and BEx queries, and of in-memory optimized cubes when aggregating data at different granularities.
HANA-HADOOP Phase I: HADOOP in conjunction with HANA is being used to evaluate Verizon call drops across the country. The data would be used to identify the density of the moving customer base and to improve the telecom infrastructure on the most commonly used routes.
• Designed and architected the solution for the HANA Phase II and Phase III implementations.
• Developed HANA models and BODS flows for Phase II and Phase III.
• Created HANA and BW on HANA mixed models using VirtualProviders.
• Moved SAP BW specific models to a hybrid BW on HANA and pure HANA paradigm.
• Developed custom ABAP SLT for prepopulating data for performance optimization.
• Found novel workarounds to multiple problems/issues.
• Created a reusable methodology for recursive hierarchies, custom SLT enhancements, HANA performance optimization techniques, and HANA stored procedure calls by Data Services.
Key Strategies used to overcome limitations of HANA and to improve performance
• Triggered stored procedures and data pushes via Data Services, due to the lack of scheduling at the HANA DB level.
• Introduced a lag in the incremental feed to counter latencies in data replication via real-time SLT.
• Developed write-enabled stored procedures, initiated by Data Services, that performed the join on data chunked into smaller subsets and populated a target table, to counter HANA crashing when joining massive tables.
• SLT transformations were used to prepopulate the tables with additional fields carrying the complex logic already built in, to counter performance degradation from nested complex logic and IF statements in HANA models.
• ALTER scripts were used to create auto-generated columns, pushing calculations down to the HANA DB level and converting and persisting the data types needed for joins. This enabled joins on different data types from disparate sources.
• Enabled the decimal-shift feature in the HANA models, and modeled time travel.
• Advanced parent-child hierarchy modelling, using stored procedures to flatten and prepopulate the hierarchy data.
• Event-based SLT transformation to counter the parameter-based SLT limitation of 3 input parameters.
• Eliminated massive joins by using the union-with-constants trick to improve performance.
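The hierarchy-flattening strategy can be sketched in a few lines. This is an illustrative example (invented region nodes, not the Verizon hierarchy) of turning a parent-child table into full root-to-node paths, the same shape a stored procedure would prepopulate so reports avoid recursive joins at query time:

```python
# Parent-child hierarchy as a child -> parent map; None marks the root.
PARENT = {"US": None, "TX": "US", "Dallas": "TX", "Austin": "TX", "CA": "US"}

def path_to_root(node):
    """Walk parent links upward, returning the path ordered root -> node."""
    path = []
    while node is not None:
        path.append(node)
        node = PARENT[node]
    return path[::-1]

# Flatten every node into its full path (one "level column" per list entry).
flattened = {node: path_to_root(node) for node in PARENT}
print(flattened["Dallas"])  # ['US', 'TX', 'Dallas']
```

Persisting these flattened paths as level columns trades a little storage for fast drill-down, which is the point of prepopulating them via a stored procedure.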
Accenture (Role: BW on HANA Developer)
Vodafone UK
Aug 2011 – Dec 2011
Vodafone was one of the early implementers of SAP HANA. The POC was a joint effort by Accenture and SAP.
Responsibilities:
• Was involved in the design and development of the HANA POC/Phase I for Vodafone UK.
• Multiple HANA models were explored and the optimized model was chosen.
• Moved multiple SAP BW models into pure HANA models.
• The final solution showed a 20x performance improvement, bringing the runtime from an hour to less than 1 minute.
The Monsanto Company
Role: SAP BW Consultant
Jan 2011 – Jul 2011
Better BI Initiative: This was a BI project involving SAP BI, SAP MDM, and SAP Business Objects Data Services 4.0. The effort involved architecting an MDM repository holding cleansed, de-duplicated master data; SAP BI would consume this master data in Phase 1 in order to do more accurate reporting.
Responsibilities:
SAP BW 7.0
· Identified the standard content and installed it for data model design and extraction.
· Designed the data model and identified the data load dependencies for the FI modules.
· Created the InfoCubes, DSOs, and InfoObjects involved in the modeling.
· Customized InfoObjects, InfoCubes, DSOs, and BEx queries per user requirements.
· Developed queries and workbooks using BEx Analyzer, with filters, variables, calculated key figures, and navigational attributes.
· Worked on routines at the object level and on data source enhancements to include customized fields in the standard data sources.
· Performed routine data loading activities per the decided schedule.
· Worked on process chains to automate the data flow.
· Developed SAP Business Objects Web Intelligence reports.
· Created universes using SAP Business Objects Universe Designer.
SAP BO Data Services 4.0
· Used Data Integrator services to move data between XML, files, RDBMS, the SAP back end, and MDM.
· Used the CDC technique of unlimited data preservation with surrogate keys for delta BI loads.
· Used various transformations to extract and enrich data before loading to the target.
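The surrogate-key CDC bullet can be illustrated with a toy load routine (hypothetical material records, not Monsanto's data): a changed record gets a new surrogate key instead of overwriting the row, so every prior version is preserved for delta loads.

```python
import itertools

surrogate = itertools.count(1)  # surrogate-key generator
target = []                     # history-preserving target table

def cdc_load(incoming):
    """Apply a batch of incoming records, preserving full change history."""
    for rec in incoming:
        current = next((r for r in target
                        if r["nat_key"] == rec["nat_key"] and r["current"]), None)
        if current and current["attrs"] == rec["attrs"]:
            continue                    # unchanged record: nothing to do
        if current:
            current["current"] = False  # close out the superseded version
        target.append({"sk": next(surrogate), "nat_key": rec["nat_key"],
                       "attrs": rec["attrs"], "current": True})

cdc_load([{"nat_key": "MAT-1", "attrs": {"plant": "A"}}])
cdc_load([{"nat_key": "MAT-1", "attrs": {"plant": "B"}}])  # attribute change
print(len(target), target[-1]["sk"])  # 2 2
```

In Data Services this is the role of the History Preserving transform feeding a key-generation step; the sketch only shows the resulting table shape.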
Westinghouse Nuclear
Role: PWR Nuclear Engineer
Mar 2008 – Dec 2010
Westinghouse Electric Company provides fuel, services, technology, plant design, and equipment for the commercial nuclear electric power industry. Westinghouse Nuclear is the world's pioneering nuclear power company (including the world's first commercial nuclear reactor) and is a leading supplier of nuclear plant products and technologies to utilities throughout the world.
Nuclear Engineer (Westinghouse Advanced Core Design and Reactor Physics Group)
· Westinghouse ND01 (Nuclear Design 01) qualified.
Responsibilities:
· Advanced Reactor Design and Analysis.
· Multi cycle optimized Core Design and Reactor Physics.
· Loading Pattern (LP) Development (using ANC 8) and LP Risk Analysis (LPRA)
· Reactor Safety and Analysis Criteria (RSAC).
· Nuclear Design Reports and Core follow.
· Module development for the high-performance parallel-processing BEACON code.
· ALPHA/PHOENIX multivariable cross-section generation.
· Engineering analysis including core neutronics and Thermal Hydraulics (T/H).
IBM (Role: Senior Systems Engineer)
Rockwell Automation
Mar 2006 – Feb 2008
Middleware Migration Initiative: Earlier, webMethods was used as the middleware; the effort transferred all the integrations to the SAP PI space. The architecture is SOA-based, utilizing the Oracle Service Bus (OSB). All WEC and external-party B2B communications were orchestrated via the OSB, which acted as both sender and consumer of web services.
Responsibilities:
SAP PI 7.0/7.1
· Acted as team lead for the Middleware Migration Project, which involved moving 76 interfaces.
· Extensively used CIDX - IDOC/RFC/FILE/JDBC/SOAP data interchange - for uniform data exchange.
· Developed global UDFs for enhancing/automating mappings.
· Extensively used SOAP sender/receiver adapters to channel interchanges via the OSB for web service routing.
· Encapsulated CIDX as a payload within a SOAP envelope, to pass parameters not available in CIDX.
· Developed generic ESR objects that used value mapping and dynamic enhanced receiver determinations.
· Developed conditional multi-receiver Order Create, Order Response, and Order Change IDOC-CIDX scenarios.
· Created alerts for all 76 MM interfaces.
· Extensively used the Seeburger PGP encryption module and SFTP for handling highly secure transactions.
Cognizant (Role: Senior Associate)
St Paul Travelers Indemnity
Feb 2005 – Jul 2006
The Enterprise Cross Sell application was developed to provide reports and analysis tracking products cross-sold between the Bond Executive Liability and Commercial Account units of St Paul Travelers. The application impacts the business by providing methods to better understand and manage risk, helping the transition from a business run on estimation to one run on measurement.
Responsibilities:
· Involved as a business analyst to develop the Management Information System software.
· Interacted with customers, business users, and end users to elicit functional objectives and gather requirements.
· Developed a .NET n-tier application using C#/Visual Studio 2003 with SQL Server as the back end.
· Developed WinForms to provide rich front-end user screens.
· Used SQL Server to design and develop database objects including tables, triggers, views, and stored procedures.
Core Design and Physics Support Group (Nuclear Engineering Internship)
First Energy Nuclear / UC
Jan 2004 - Dec 2004
Davis Besse is a B&W pressurized water reactor owned by First Energy Nuclear. During transients, the reactor undergoes power oscillations, which lead to instabilities.
· Was involved in developing/modifying a computational 1D core model (TANC) in FORTRAN for predicting axial power imbalance and other parameters for the B&W PWR.
· Developed a user-friendly GUI in C#.NET that communicated with the FORTRAN neutronics code.
· Benchmarked TANC against Davis Besse (B&W PWR) Cycle 14.
BWR Core Uprate
· Up-rated the Perry Nuclear Power Plant (GE BWR) power by 18% (to 4223 MWth), in conjunction with an extension of its fuel cycle length from 18 to 24 months.
· The CASMO-3/SIMULATE-3 package, along with custom FORTRAN code, was used to develop the bundle and core designs necessary to achieve the noted goals.
· An equilibrium core design with an enhanced spectral-shift approach was employed, whereby rod patterns and flow are combined to promote a strongly bottom-peaked axial power profile.